Lasso, fractional norm and structured sparse estimation using a Hadamard product parametrization
Author
Abstract
Using a multiplicative reparametrization, it is shown that a subclass of Lq penalties with q ≤ 1 can be expressed as sums of L2 penalties. It follows that the lasso and other norm-penalized regression estimates may be obtained using a very simple and intuitive alternating ridge regression algorithm. As compared to a similarly intuitive EM algorithm for Lq optimization, the proposed algorithm avoids some numerical instability issues and is also competitive in terms of speed. Furthermore, the proposed algorithm can be extended to accommodate sparse high-dimensional scenarios and generalized linear models, and can be used to create structured sparsity via penalties derived from covariance models for the parameters. Such model-based penalties may be useful for sparse estimation of spatially or temporally structured parameters. © 2017 Published by Elsevier B.V.
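As a concrete illustration of the alternating ridge idea described in the abstract, the following is a minimal Python/NumPy sketch, not the paper's own implementation. It assumes the standard lasso objective ||y - X beta||_2^2 + lambda ||beta||_1 and the factorization beta = u ∘ v, under which lambda ||beta||_1 equals the minimum of (lambda/2)(||u||_2^2 + ||v||_2^2) over all u, v with u ∘ v = beta; each update is then an ordinary ridge regression. The function name, starting values, stopping rule, and the synthetic data are illustrative choices only.

import numpy as np

def hadamard_lasso(X, y, lam, n_iter=500, tol=1e-8):
    # Lasso via the Hadamard product parametrization beta = u * v (sketch).
    # With v fixed, the design becomes X @ diag(v) and the penalty
    # (lam/2) * ||u||^2, so the u-update is a ridge regression; symmetrically for v.
    n, p = X.shape
    u = np.ones(p)
    v = np.ones(p)
    beta = u * v
    for _ in range(n_iter):
        Xv = X * v                                           # X @ diag(v), via broadcasting
        u = np.linalg.solve(Xv.T @ Xv + 0.5 * lam * np.eye(p), Xv.T @ y)
        Xu = X * u                                           # X @ diag(u)
        v = np.linalg.solve(Xu.T @ Xu + 0.5 * lam * np.eye(p), Xu.T @ y)
        beta_new = u * v
        if np.max(np.abs(beta_new - beta)) < tol:            # stop when iterates stabilize
            beta = beta_new
            break
        beta = beta_new
    return beta

# Illustrative use on synthetic data (values chosen arbitrarily for this sketch)
rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.5 * rng.standard_normal(n)
print(np.round(hadamard_lasso(X, y, lam=20.0), 3))

In this simple sketch the coefficients shrink toward zero without reaching it exactly, so a small threshold may be applied to read off the support; the paper's extensions to high-dimensional settings, generalized linear models, and covariance-based structured penalties are not reflected here.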
Similar resources
A regularized matrix factorization approach to induce structured sparse-low-rank solutions in the EEG inverse problem
We consider the estimation of the Brain Electrical Sources (BES) matrix from noisy electroencephalographic (EEG) measurements, commonly known as the EEG inverse problem. We propose a new method to induce neurophysiologically meaningful solutions, which takes into account the smoothness, structured sparsity, and low rank of the BES matrix. The method is based on the factorization of the BES matrix...
Estimation Error of the Lasso
This paper presents an upper bound for the estimation error of the constrained lasso, under the high-dimensional (n < p) setting. In contrast to existing results, the error bound in this paper is sharp, is valid when the parameter to be estimated is not exactly sparse (e.g., when the parameter is weakly sparse), and shows explicitly the effect of over-estimating the l1-norm of the parameter to ...
A general framework for fast stagewise algorithms
Forward stagewise regression follows a very simple strategy for constructing a sequence of sparse regression estimates: it starts with all coefficients equal to zero, and iteratively updates the coefficient (by a small amount) of the variable that achieves the maximal absolute inner product with the current residual. This procedure has an interesting connection to the lasso: under some conditi...
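The update rule described in this excerpt is simple enough to sketch directly. The following minimal Python/NumPy version is illustrative only; the fixed step size and iteration count are arbitrary choices for the example, not taken from the paper.

import numpy as np

def forward_stagewise(X, y, eps=0.01, n_steps=5000):
    # Epsilon-forward stagewise regression (sketch): repeatedly nudge the
    # coefficient of the variable most correlated with the current residual.
    n, p = X.shape
    beta = np.zeros(p)
    resid = y.copy()
    for _ in range(n_steps):
        corr = X.T @ resid                 # inner products with the residual
        j = np.argmax(np.abs(corr))        # variable with maximal absolute inner product
        delta = eps * np.sign(corr[j])     # small step in the sign of that inner product
        beta[j] += delta
        resid -= delta * X[:, j]           # incremental residual update
    return beta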
Certain subclass of $p$-valent meromorphic Bazilevi\'{c} functions defined by fractional $q$-calculus operators
The aim of the present paper is to introduce and investigate a new subclass of Bazilevi\'{c} functions in the punctured unit disk $\mathcal{U}^*$ which are defined through the use of the well-known fractional $q$-calculus operators, Hadamard product and a linear operator. In addition, we obtain some sufficient conditions for the func...
c-LASSO and its dual for sparse signal estimation from array data
We treat the estimation of a sparse set of sources emitting plane waves observed by a sensor array as a complex-valued LASSO (c-LASSO) problem where the usual l1-norm constraint is replaced by the l1-norm of a matrix D times the solution vector. When the sparsity order is given, algorithmically selecting a suitable value for the c-LASSO regularization parameter remains a challenging task. The c...
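Written out, the criterion described in this excerpt is presumably of the generalized-lasso form shown below; the symbols (array steering matrix $\mathbf{A}$, transform $\mathbf{D}$, complex source vector $\mathbf{x}$, measurements $\mathbf{y}$, and regularization parameter $\lambda$) are my notation rather than the paper's, and a constrained variant with a residual bound could be used in place of the penalized one.

\[
\hat{\mathbf{x}} \;=\; \arg\min_{\mathbf{x}\in\mathbb{C}^{p}} \;\tfrac{1}{2}\,\lVert \mathbf{y}-\mathbf{A}\mathbf{x}\rVert_{2}^{2} \;+\; \lambda\,\lVert \mathbf{D}\mathbf{x}\rVert_{1}.
\]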
Journal: Computational Statistics & Data Analysis
Volume: 115
Issue: -
Pages: -
Publication year: 2017